MTFH: A Matrix Tri-Factorization Hashing Framework for Efficient Cross-Modal Retrieval

Authors

Abstract

Hashing has recently sparked a great revolution in cross-modal retrieval because of its low storage cost and high query speed. Recent hashing methods often learn unified or equal-length hash codes to represent the multi-modal data and make them intuitively comparable. However, such representations could inherently sacrifice their representation scalability, because data from different modalities may not have one-to-one correspondence and could be encoded more efficiently by hash codes of unequal lengths. To mitigate these problems, this paper exploits a related and relatively unexplored problem: encoding heterogeneous data with varying hash lengths so as to generalize cross-modal retrieval to various challenging scenarios. To this end, a generalized and flexible framework, termed Matrix Tri-Factorization Hashing (MTFH), is proposed to work seamlessly in various settings, including paired or unpaired data and equal or varying hash-length encoding. More specifically, MTFH exploits an efficient objective function to flexibly learn modality-specific hash codes under different length settings, while synchronously learning two semantic correlation matrices to semantically correlate the different hash representations and make heterogeneous data comparable. As a result, the derived hash codes are more semantically meaningful for various cross-modal retrieval tasks. Extensive experiments evaluated on public benchmark datasets highlight the superiority of MTFH under various scenarios and show its competitive performance against the state-of-the-arts.
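To make the tri-factorization idea in the abstract concrete, below is a minimal illustrative sketch, not the paper's actual algorithm (MTFH uses a discrete optimization and two correlation matrices; this simplification uses a single correlation matrix, real-valued relaxation, and invented toy data). A cross-modal affinity matrix S is factorized as H_img · M · H_txtᵀ, where the two code matrices have unequal lengths and the learned matrix M makes them comparable:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy cross-modal affinity: +1 if an image/text pair shares a (synthetic) label, else -1.
n_img, n_txt = 60, 50
img_labels = rng.integers(0, 3, n_img)
txt_labels = rng.integers(0, 3, n_txt)
S = np.where(img_labels[:, None] == txt_labels[None, :], 1.0, -1.0)

# Unequal code lengths for the two modalities, as the framework allows.
q_img, q_txt = 16, 8

# Real-valued relaxations of the hash codes and the correlation matrix.
H_img = rng.standard_normal((n_img, q_img))
H_txt = rng.standard_normal((n_txt, q_txt))
M = rng.standard_normal((q_img, q_txt))

# Alternating least squares on ||S - H_img @ M @ H_txt.T||_F^2:
# each update is the exact minimizer with the other two factors held fixed.
for _ in range(100):
    M = np.linalg.pinv(H_img) @ S @ np.linalg.pinv(H_txt).T
    H_img = S @ np.linalg.pinv(M @ H_txt.T)
    H_txt = (np.linalg.pinv(H_img @ M) @ S).T

# Binarize to obtain the final unequal-length hash codes.
B_img = np.where(H_img >= 0, 1, -1)
B_txt = np.where(H_txt >= 0, 1, -1)

# Cross-modal matching score: 16-bit and 8-bit codes are compared
# through the learned 16x8 correlation matrix M.
scores = B_img @ M @ B_txt.T
```

The point of the sketch is the shape of the problem: neither modality's code length constrains the other, and retrieval across modalities goes through the correlation matrix rather than a shared Hamming space.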


Related Articles

Correlation Hashing Network for Efficient Cross-Modal Retrieval

Due to the storage and retrieval efficiency, hashing has been widely deployed to approximate nearest neighbor search for large-scale multimedia retrieval. Cross-modal hashing, which improves the quality of hash coding by exploiting the semantic correlation across different modalities, has received increasing attention recently. For most existing cross-modal hashing methods, an object is first r...


Supervised Matrix Factorization for Cross-Modality Hashing

Matrix factorization has been recently utilized for the task of multi-modal hashing for cross-modality visual search, where basis functions are learned to map data from different modalities to the same Hamming embedding. In this paper, we propose a novel cross-modality hashing algorithm termed Supervised Matrix Factorization Hashing (SMFH) which tackles the multi-modal hashing problem with a co...


Self-Supervised Adversarial Hashing Networks for Cross-Modal Retrieval

Thanks to the success of deep learning, cross-modal retrieval has made significant progress recently. However, there still remains a crucial bottleneck: how to bridge the modality gap to further enhance the retrieval accuracy. In this paper, we propose a self-supervised adversarial hashing (SSAH) approach, which lies among the early attempts to incorporate adversarial learning into cross-modal ...


Pairwise Relationship Guided Deep Hashing for Cross-Modal Retrieval

With the benefits of low storage cost and fast query speed, cross-modal hashing has received considerable attention recently. However, almost all existing methods on cross-modal hashing cannot obtain powerful hash codes due to directly utilizing hand-crafted features or ignoring heterogeneous correlations across different modalities, which will greatly degrade the retrieval performance. In this pape...


HashGAN: Attention-aware Deep Adversarial Hashing for Cross Modal Retrieval

With the rapid growth of multi-modal data, hashing methods for cross-modal retrieval have received considerable attention. Deep-networks-based cross-modal hashing methods are appealing as they can integrate feature learning and hash coding into end-to-end trainable frameworks. However, it is still challenging to find content similarities between different modalities of data due to the heterogenei...



Journal

Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence

Year: 2021

ISSN: 1939-3539, 2160-9292, 0162-8828

DOI: https://doi.org/10.1109/tpami.2019.2940446